
    Phase Retrieval of Quaternion Signal via Wirtinger Flow

    The main aim of this paper is to study quaternion phase retrieval (QPR), i.e., the recovery of a quaternion signal from the magnitudes of quaternion linear measurements. We show that all d-dimensional quaternion signals can be reconstructed, up to a global right quaternion phase factor, from O(d) phaseless measurements. We also develop a scalable algorithm, quaternion Wirtinger flow (QWF), for solving QPR, and establish its linear convergence guarantee. Compared with the analysis of complex Wirtinger flow, a series of different treatments are employed to overcome the difficulties caused by the non-commutativity of quaternion multiplication. Moreover, we develop a variant of QWF that can effectively utilize a pure quaternion prior (e.g., for color images) by incorporating a quaternion phase factor estimate into the QWF iterations. The estimate can be computed efficiently, as it amounts to finding a singular vector of a 4×4 real matrix. Motivated by variants of Wirtinger flow in prior work, we further propose quaternion truncated Wirtinger flow (QTWF), quaternion truncated amplitude flow (QTAF) and their pure quaternion versions. Experimental results on synthetic data and color images are presented to validate our theoretical results. In particular, for pure quaternion signal recovery, our quaternion methods often succeed with notably fewer measurements than real methods based on the monochromatic or concatenation models. Comment: 21 pages (paper + supplemental), 6 figures
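    The abstract does not spell out the iteration itself; as a point of reference, the sketch below implements the standard complex Wirtinger flow that QWF generalizes (spectral initialization followed by gradient descent on the intensity loss). The dimensions, step-size schedule, and iteration count are illustrative assumptions, and the quaternion-specific treatment is not reproduced.

```python
# Minimal complex Wirtinger flow sketch (spectral initialization + gradient descent
# on the intensity loss); illustrative only -- the paper's QWF runs the analogous
# iteration over quaternions, which is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
d, m = 64, 6 * 64                                   # dimension, number of measurements
x = rng.standard_normal(d) + 1j * rng.standard_normal(d)          # ground-truth signal
A = (rng.standard_normal((m, d)) + 1j * rng.standard_normal((m, d))) / np.sqrt(2)
y = np.abs(A @ x) ** 2                              # phaseless (intensity) measurements

# Spectral initialization: leading eigenvector of (1/m) * sum_k y_k a_k a_k^*.
Y = (A.conj().T * y) @ A / m
_, V = np.linalg.eigh(Y)
z = V[:, -1] * np.sqrt(y.mean())                    # scale by the measurement energy
z0_sq = np.linalg.norm(z) ** 2

# Gradient descent on f(z) = (1/2m) * sum_k (|<a_k, z>|^2 - y_k)^2.
for t in range(800):
    mu = min(1 - np.exp(-(t + 1) / 330), 0.2)       # step-size warm-up heuristic
    Az = A @ z
    z = z - (mu / z0_sq) * (A.conj().T @ ((np.abs(Az) ** 2 - y) * Az) / m)

# Report the error up to a global phase (the complex analogue of the
# global right quaternion phase factor in the paper).
c = np.vdot(z, x)
rel_err = np.linalg.norm(x - (c / abs(c)) * z) / np.linalg.norm(x)
print(f"relative error up to a global phase: {rel_err:.2e}")
```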

    Uniform Exact Reconstruction of Sparse Signals and Low-rank Matrices from Phase-Only Measurements

    In phase-only compressive sensing (PO-CS), our goal is to recover low-complexity signals (e.g., sparse signals, low-rank matrices) from the phases of complex linear measurements. While perfect recovery of the signal direction in PO-CS was observed quite early, an exact reconstruction guarantee for a fixed, real signal was only recently established by Jacques and Feuillen [IEEE Trans. Inf. Theory, 67 (2021), pp. 4150-4161]. However, two questions remain open: a uniform recovery guarantee and exact recovery of complex signals. In this paper, we almost completely address these two open questions. We prove that all complex sparse signals or low-rank matrices can be uniformly and exactly recovered from a near-optimal number of complex Gaussian measurement phases. By recasting PO-CS as a linear compressive sensing problem, the exact recovery follows from the restricted isometry property (RIP). Our approach to the uniform recovery guarantee is based on covering arguments that involve a delicate control of the (original linear) measurements with overly small magnitude. To handle complex signals, a different sign-product embedding property and a careful rescaling of the sensing matrix are employed. In addition, we show, as an extension, that the uniform recovery is stable under moderate bounded noise. We also propose to add Gaussian dither before capturing the phases to achieve full reconstruction with norm information. Experimental results are reported to corroborate and demonstrate our theoretical results. Comment: 39 pages, 1 figure
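    The recasting idea can be made concrete in a few lines. Below is a minimal toy sketch, assuming a real signal and no sparsity constraint: for the true signal, each measured phase makes Im(conj(z_k)·⟨a_k, x⟩) = 0 a homogeneous linear equation, and one extra equation fixes the scale, so an ordinary least-squares solve already recovers the direction of x. The dimensions and the plain least-squares solver are illustrative assumptions; the paper's sparse/low-rank estimators, complex-signal treatment, and dithering are not reproduced.

```python
# Toy illustration of recasting phase-only CS as a linear problem, assuming a
# real signal x and complex Gaussian measurements; sparsity is not exploited here.
import numpy as np

rng = np.random.default_rng(1)
d, m = 50, 200
x = rng.standard_normal(d)
A = (rng.standard_normal((m, d)) + 1j * rng.standard_normal((m, d))) / np.sqrt(2)

z = A @ x
phases = z / np.abs(z)                       # phase-only measurements sign(<a_k, x>)

# For the true x, conj(phases_k) * <a_k, x> = |<a_k, x>| is real and positive, so
#   Im(conj(phases_k) * <a_k, x>) = 0        (m homogeneous linear equations in x),
#   (1/m) * sum_k Re(conj(phases_k) * <a_k, x>) > 0 can be pinned to 1 to fix scale.
B_im = np.imag(phases.conj()[:, None] * A)   # m x d real matrix of Im-constraints
b_norm = np.real(phases.conj() @ A) / m      # one extra row for the scale constraint
B = np.vstack([B_im, b_norm])
e = np.zeros(m + 1)
e[-1] = 1.0

x_hat, *_ = np.linalg.lstsq(B, e, rcond=None)

# Phase-only measurements lose the norm, so compare directions only.
cos_sim = abs(x_hat @ x) / (np.linalg.norm(x_hat) * np.linalg.norm(x))
print(f"cosine similarity with the true direction: {cos_sim:.6f}")
```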

    Quantized Low-Rank Multivariate Regression with Random Dithering

    Low-rank multivariate regression (LRMR) is an important statistical learning model that combines highly correlated tasks into a multiresponse regression problem with a low-rank prior on the coefficient matrix. In this paper, we study quantized LRMR, a practical setting where the responses and/or the covariates are discretized to finite precision. We focus on estimating the underlying coefficient matrix. To make possible a consistent estimator that can achieve arbitrarily small error, we employ uniform quantization with random dithering, i.e., we add appropriate random noise to the data before quantization. Specifically, uniform dither and triangular dither are used for the responses and the covariates, respectively. Based on the quantized data, we propose constrained Lasso and regularized Lasso estimators and derive non-asymptotic error bounds. With the aid of dithering, the estimators achieve the minimax optimal rate, while quantization only slightly worsens the multiplicative factor in the error rate. Moreover, we extend our results to a low-rank regression model with matrix responses. We corroborate and demonstrate our theoretical results via simulations on synthetic data and image restoration. Comment: 16 pages (Submitted)
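    The dithering step described above admits a short illustration. The sketch below, with an assumed resolution Delta and a cell-center uniform quantizer, shows the uniform dither used for responses (giving a conditionally unbiased quantized value) and the triangular dither used for covariates (giving a mean-zero quantization error with known, input-independent variance Delta^2/4); the Lasso estimators built on the quantized data are omitted.

```python
# Minimal sketch of uniform quantization with random dithering, assuming the
# cell-center quantizer Q(v) = Delta*(floor(v/Delta) + 1/2); Delta and all variable
# names are illustrative, and the downstream Lasso estimators are omitted.
import numpy as np

rng = np.random.default_rng(2)
Delta = 0.5                                  # quantization resolution (assumed)

def quantize(v, delta):
    """Uniform quantizer mapping v to the center of its cell of width delta."""
    return delta * (np.floor(v / delta) + 0.5)

v = rng.standard_normal(200_000)             # stand-in for responses or covariates

# Uniform dither (used for responses): Q(v + tau) is conditionally unbiased for v.
tau_uni = rng.uniform(-Delta / 2, Delta / 2, size=v.shape)
q_uni = quantize(v + tau_uni, Delta)

# Triangular dither (used for covariates): sum of two independent uniforms; the
# quantization error is then mean-zero with known, input-independent variance
# Delta^2/4, which is what allows correcting the covariance of quantized covariates.
tau_tri = rng.uniform(-Delta / 2, Delta / 2, size=v.shape) \
        + rng.uniform(-Delta / 2, Delta / 2, size=v.shape)
q_tri = quantize(v + tau_tri, Delta)

print("mean error, uniform dither:    ", np.mean(q_uni - v))       # ~ 0
print("error variance, triangular:    ", np.var(q_tri - v))        # ~ Delta^2/4
print("theoretical value Delta^2/4 =  ", Delta ** 2 / 4)
```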